
【Hackathon 8th No.23】Improved Training of Wasserstein GANs paper reproduction #1147


Closed
wants to merge 48 commits into from

Conversation

robinbg

@robinbg robinbg commented Apr 27, 2025

PR types

PR changes

Describe


paddle-bot bot commented Apr 27, 2025

Thanks for your contribution!

Contributor

@lijialin03 lijialin03 left a comment


Thanks for the contribution~
Running the current code directly raises an error; please check the parameters/code first, thanks.

import paddle.nn as nn
import paddle.vision.transforms as transforms

from ..models.wgan_gp import WGAN_GP
Contributor


At runtime the file cannot be found; change this to a dynamically resolved path:

import os
import sys

ROOT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(ROOT_DIR)
from models.wgan_gp import WGAN_GP
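For reference, the two nested `os.path.dirname` calls walk up two levels from the file's location, so a script inside a subdirectory resolves the repository root. The layout below is purely illustrative (not the actual paths from this PR):

```python
import os

# Hypothetical file location inside the repo; the first dirname strips
# the filename, the second strips the containing directory.
fake_file = "/repo/wgan_gp/examples/train_cifar10.py"
root_dir = os.path.dirname(os.path.dirname(os.path.abspath(fake_file)))
print(root_dir)  # /repo/wgan_gp
```

Appending this root to `sys.path` lets `from models.wgan_gp import WGAN_GP` work regardless of the current working directory.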

Author


But that won't pass the code style check.

Contributor


OK, sounds good. Let's focus on getting it to run first, then; it still doesn't seem to run on my end.

nn.Linear(noise_dim, 512 * 4 * 4),
nn.BatchNorm1D(512 * 4 * 4),
nn.ReLU(),
lambda x: x.reshape([-1, 512, 4, 4]),
Contributor


This raises an error at runtime: Paddle's nn.Sequential can only contain objects that inherit from nn.Layer, so a lambda expression triggers an `assert isinstance(layer, Layer)` error. It can be rewritten along these lines:

class CIFAR10Generator(nn.Layer):
    def __init__(self, noise_dim=100, output_channels=3):
        super(CIFAR10Generator, self).__init__()

        self.layers1 = nn.Sequential(
            nn.Linear(noise_dim, 512 * 4 * 4),
            nn.BatchNorm1D(512 * 4 * 4),
            nn.ReLU(),
        )
        self.layers2 = nn.Sequential(
            nn.Conv2DTranspose(512, 256, 4, 2, 1),
            nn.BatchNorm2D(256),
            nn.ReLU(),
            nn.Conv2DTranspose(256, 128, 4, 2, 1),
            nn.BatchNorm2D(128),
            nn.ReLU(),
            nn.Conv2DTranspose(128, output_channels, 4, 2, 1),
            nn.Tanh(),
        )

    def forward(self, x):
        x = self.layers1(x)
        x = x.reshape([-1, 512, 4, 4])
        x = self.layers2(x)
        return x
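As a quick sanity check that the suggested architecture yields 32x32 CIFAR-10 images, each stride-2 `Conv2DTranspose` (kernel 4, stride 2, padding 1) doubles the spatial size per the standard transposed-convolution formula. The helper below is hypothetical, for illustration only:

```python
def conv_transpose_out(size, kernel=4, stride=2, padding=1):
    # Standard transposed-convolution output size (no output_padding):
    # out = (in - 1) * stride - 2 * padding + kernel
    return (size - 1) * stride - 2 * padding + kernel

size = 4  # spatial size after reshaping the linear output to [-1, 512, 4, 4]
for _ in range(3):  # the three Conv2DTranspose layers in layers2
    size = conv_transpose_out(size)
print(size)  # 32
```

So the progression is 4 → 8 → 16 → 32, matching the CIFAR-10 resolution.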

real_output = self.discriminator(real_data)
fake_output = self.discriminator(fake_data)

gp = self.gradient_penalty(real_data, fake_data)
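For context, the `gradient_penalty` term computed here is the WGAN-GP regularizer lambda * E[(||grad_x_hat D(x_hat)||_2 - 1)^2], evaluated at random interpolates x_hat between real and fake samples. A framework-free NumPy sketch (using a toy linear critic D(x) = w @ x, chosen so its gradient is known analytically to be w) illustrates what the penalty measures:

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([0.6, 0.8])   # toy linear critic D(x) = w @ x, so grad D = w everywhere
real = rng.normal(size=(8, 2))
fake = rng.normal(size=(8, 2))

# Interpolate between real and fake samples (eps ~ U[0, 1] per sample).
eps = rng.uniform(size=(8, 1))
x_hat = eps * real + (1 - eps) * fake

# For a linear critic the gradient w.r.t. x_hat is w for every sample.
grads = np.tile(w, (x_hat.shape[0], 1))
grad_norms = np.linalg.norm(grads, axis=1)

lam = 10.0                 # the paper's default penalty weight
gp = lam * np.mean((grad_norms - 1.0) ** 2)
print(gp)  # ~0, since ||w|| = 1 for this critic
```

In the actual model the gradients come from a second backward pass through the critic (hence `create_graph=True`), which is exactly what triggers the double-backward error discussed below.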
Contributor


Error during execution: RuntimeError: (Unavailable) The Op flatten_grad doesn't have any grad op. If you don't intend calculating higher order derivatives, please set `create_graph` to False. (at ../paddle/fluid/eager/api/generated/eager_generated/backwards/nodes.cc:15349)

@lijialin03
Contributor

lijialin03 commented May 16, 2025

Taking the cifar case as an example, the status I have verified so far is:

  1. The following two lines of code raise errors
     (screenshots of the failing lines)
  2. After temporarily commenting them out and continuing to run, the loss is extremely large
     (screenshot of the loss output)

@luotao1
Collaborator

luotao1 commented May 23, 2025

@luotao1 luotao1 closed this May 23, 2025

3 participants